Strongly improved stability and faster convergence of temporal sequence learning by utilising input correlations only

Authors

  • Bernd Porr
  • Florentin Wörgötter
Abstract

Currently all important, low-level, unsupervised network learning algorithms follow the paradigm of Hebb, where input and output activity are correlated to change the connection strength of a synapse. However, as a consequence, classical Hebbian learning always carries a potentially destabilising autocorrelation term, which is due to the fact that every input is reflected, in weighted form, in the neuron's output. This self-correlation can lead to positive feedback, where increasing weights will increase the output and vice versa, which may result in divergence. This can be avoided by different strategies like weight normalisation or weight saturation, which, however, can cause problems of their own. Consequently, in most cases, high learning rates cannot be used for Hebbian learning, leading to relatively slow convergence. Here we introduce a novel correlation-based learning rule which is related to our ISO-learning rule (Porr and Wörgötter, 2003a), but replaces the derivative of the output in the learning rule with the derivative of the reflex input. Hence the new rule utilises input correlations only, effectively implementing strict heterosynaptic learning. This looks like a minor modification but leads to dramatically improved properties. Elimination of the output from the learning rule removes the unwanted, destabilising autocorrelation term, allowing us to use high learning rates. As a consequence we can mathematically show that the theoretical optimum of one-shot learning can be reached under ideal conditions with the new rule. This result is then tested against four different experimental setups and we show that in all of them very few (and sometimes only one) learning experiences are needed to achieve the learning goal. As a consequence the new learning rule is up to 100 times faster and in general more stable than ISO-learning.
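As a rough illustration of the modification described above: in ISO-learning the predictive weight changes as $\rho_1' = \mu u_1 v'$, where the output $v = \rho_0 u_0 + \rho_1 u_1$ re-enters the rule and contributes the destabilising autocorrelation term $\mu \rho_1 u_1 u_1'$, whereas the new rule correlates only the two inputs: $\rho_1' = \mu u_1 u_0'$. The sketch below is a minimal discretisation of that input-only update; the leaky low-pass filter, the pulse timings and all parameter values are illustrative assumptions, not the paper's actual resonator filter bank or experimental settings.

```python
import numpy as np

def lowpass(x, tau=20.0):
    # Simple leaky integrator standing in for the band-pass
    # resonators h(t) of the paper (an assumption for brevity).
    u = np.zeros_like(x)
    for t in range(1, len(x)):
        u[t] = u[t - 1] + (x[t] - u[t - 1]) / tau
    return u

def ico_trial(x0, x1, rho1, mu=0.05):
    # Input-correlation-only update: d(rho1)/dt = mu * u1 * du0/dt.
    # The output v never appears, so no autocorrelation term arises.
    u0, u1 = lowpass(x0), lowpass(x1)
    du0 = np.gradient(u0)            # derivative of the reflex input
    return rho1 + mu * np.sum(u1 * du0)

# Hypothetical toy trial: a predictive cue at t=100 precedes the
# reflex event at t=130, so u1 still overlaps the rising flank of u0.
T = 500
x1 = np.zeros(T); x1[100] = 1.0      # predictive input
x0 = np.zeros(T); x0[130] = 1.0      # reflex input
print(ico_trial(x0, x1, rho1=0.0))   # small positive weight change
```

One consequence of this form is that the update integrates $u_1 u_0'$, so weight change ceases by itself once the reflex input stays at baseline ($u_0' = 0$), which is consistent with the high learning rates and one-shot behaviour reported in the abstract.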


Related articles

Strongly Improved Stability and Faster Convergence of Temporal Sequence Learning by Using Input Correlations Only

Currently all important, low-level, unsupervised network learning algorithms follow the paradigm of Hebb, where input and output activity are correlated to change the connection strength of a synapse. However, as a consequence, classical Hebbian learning always carries a potentially destabilizing autocorrelation term, which is due to the fact that every input is in a weighted form reflected in ...

Full text

Strongly $\left[V_{2}, \lambda_{2}, M, p\right]$-summable double sequence spaces defined by Orlicz function

In this paper we introduce strongly $\left[V_{2}, \lambda_{2}, M, p\right]$-summable double sequence spaces via an Orlicz function and examine some properties of these resulting spaces. We also give a natural relationship between these spaces and $S_{\lambda_{2}}$-statistical convergence.

Full text

Iterative learning identification and control for dynamic systems described by NARMAX model

A new iterative learning controller is proposed for a general unknown discrete time-varying nonlinear non-affine system represented by a NARMAX (Nonlinear Autoregressive Moving Average with eXogenous inputs) model. The proposed controller is composed of an iterative learning neural identifier and an iterative learning controller. Iterative learning control and iterative learning identification ar...

Full text

Double Sequence Iterations for Strongly Contractive Mapping in Modular Space

In this paper, we consider double sequence iteration processes for a strongly $\rho$-contractive mapping in modular space. It is proved that these sequences converge strongly to a fixed point of the strongly $\rho$-contractive mapping.

متن کامل

Learning input correlations through nonlinear temporally asymmetric Hebbian plasticity.

Triggered by recent experimental results, temporally asymmetric Hebbian (TAH) plasticity is considered a candidate model for the biological implementation of competitive synaptic learning, a key concept for the experience-based development of cortical circuitry. However, because of the well-known positive-feedback instability of correlation-based plasticity, the stability of the resulting le...

Full text


Journal:

Volume   Issue

Pages  -

Publication date: 2009